Processing Multiple Signals with input_variants()

When you have multiple signals (from different files, sensors, or test scenarios) that need the same processing, use `.input_variants()` to run them all through the same pipeline.
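The fan-out idea is easy to see without the library: apply one list of processing stages to several named inputs and collect the results. A minimal plain-Python sketch of the concept (illustrative only, not the sigexec API; the toy "stages" just scale and offset each sample):

```python
def run_pipeline(signal, stages):
    """Apply each processing stage in order to one input signal."""
    for stage in stages:
        signal = stage(signal)
    return signal

def fan_out(signals, names, stages):
    """Run every named input through the same stage list, as input_variants() does."""
    return [(name, run_pipeline(sig, stages)) for name, sig in zip(names, signals)]

# Toy stages: double every sample, then add 1
stages = [lambda s: [x * 2 for x in s], lambda s: [x + 1 for x in s]]
results = fan_out([[1, 2], [3, 4]], ['Dataset A', 'Dataset B'], stages)
# results -> [('Dataset A', [3, 5]), ('Dataset B', [7, 9])]
```

Each input goes through the identical stage list, so differences in the outputs come only from the data, never from the processing.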

Method 1: Process Multiple Input Signals


```python
from sigexec import Graph
from sigexec.blocks import RangeCompress, DopplerCompress
from sigexec.core.data import SignalData

# Load or generate different signals (load_signal is a user-supplied helper)
signal_dataset_a = load_signal("dataset_a.bin")
signal_dataset_b = load_signal("dataset_b.bin")
signal_dataset_c = load_signal("dataset_c.bin")

# Process all three through the same pipeline
results = (Graph()
    .input_variants([signal_dataset_a, signal_dataset_b, signal_dataset_c],
                    names=['Dataset A', 'Dataset B', 'Dataset C'])
    .add(RangeCompress(window='hamming'))
    .add(DopplerCompress(window='hann'))
    .run()
)

# Results contains one (params, result) tuple for each input signal
for params, result in results:
    dataset_name = params['variant'][0]
    print(f"{dataset_name}: peak at {find_peak(result)}")
```

Live Example: Three Different Target Scenarios

Processing three radar scenarios with different target parameters through the same range/Doppler compression pipeline.

Method 2: Lazy Loading with Variants

For large datasets or many files, you don't want to load everything into memory at once. Use `.variants()` with a loader factory to load data lazily during pipeline execution.
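The loader-factory pattern itself is plain Python: the outer function captures the filename, and the inner closure performs the actual I/O only when it is finally called. A small sketch of the deferral (using a counter in place of real file I/O so the timing is visible; not sigexec internals):

```python
load_count = 0

def make_loader(filename):
    """Capture the filename now; defer the expensive load until call time."""
    def load(_):
        global load_count
        load_count += 1              # stands in for np.load(filename)
        return f"data from {filename}"
    return load

# Building the loaders touches no files at all
loaders = [make_loader(f) for f in ['a.npy', 'b.npy', 'c.npy']]
assert load_count == 0               # nothing loaded yet: only closures exist

first = loaders[0](None)             # the load happens here, on demand
assert load_count == 1
assert first == "data from a.npy"
```

This is why the factory wrapper matters: passing `np.load(filename)` directly would load every file up front, while the closure postpones each load until the pipeline asks for that variant.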


```python
import numpy as np

from sigexec import Graph
from sigexec.blocks import StackPulses, RangeCompress, DopplerCompress
from sigexec.core.data import SignalData

# Create a loader factory - data is loaded only when the variant executes
def make_loader(filename):
    def load(_):
        data = np.load(filename)  # Loaded on demand, not upfront
        return SignalData(data, sample_rate=20e6)
    return load

# Process multiple files through the same pipeline
# Each file is loaded only when needed, one at a time
results = (Graph()
    .variants(make_loader,
              ['dataset_a.npy', 'dataset_b.npy', 'dataset_c.npy'],
              names=['Dataset A', 'Dataset B', 'Dataset C'])
    .add(StackPulses())
    .add(RangeCompress(window='hamming'))
    .add(DopplerCompress(window='hann'))
    .run()
)

# Results contains one (params, result) tuple for each file
for params, result in results:
    dataset_name = params['variant'][0]
    print(f"Processed {dataset_name}")
```

Live Example: Lazy Loading from Saved Files

First, generate and save three different target scenarios to files:

✓ Saved 3 signal files to temporary directory

Now load and process them lazily - one file at a time:

Method 3: Combine Lazy Loading with Processing Variants

Combine lazy-loaded data variants with processing parameter variants to explore the full cartesian product without loading all data into memory at once.
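The size of the run is just the cartesian product of the variant levels, which `itertools.product` makes concrete (an illustrative count, not sigexec internals):

```python
from itertools import product

files = ['sig_a.npy', 'sig_b.npy', 'sig_c.npy']
range_windows = ['hamming', 'blackman']
doppler_windows = ['hann', 'hamming']

# Every (file, range window, Doppler window) combination the graph will run
combos = list(product(files, range_windows, doppler_windows))
assert len(combos) == 12  # 3 files x 2 range windows x 2 Doppler windows
assert combos[0] == ('sig_a.npy', 'hamming', 'hann')
```

Each additional `.variants()` call multiplies the combination count, so the memory savings from lazy loading matter more as the product grows.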


```python
import numpy as np

from sigexec import Graph
from sigexec.blocks import StackPulses, RangeCompress, DopplerCompress
from sigexec.core.data import SignalData

# Loader factory
def make_loader(filename):
    def load(_):
        data = np.load(filename)
        return SignalData(data, sample_rate=20e6)
    return load

# 3 files × 2 range windows × 2 Doppler windows = 12 total combinations
# But only one file is in memory at a time!
results = (Graph()
    .variants(make_loader,
              ['sig_a.npy', 'sig_b.npy', 'sig_c.npy'],
              names=['Signal A', 'Signal B', 'Signal C'])
    .add(StackPulses())
    .variants(lambda w: RangeCompress(window=w),
              ['hamming', 'blackman'],
              names=['Hamming', 'Blackman'])
    .variants(lambda w: DopplerCompress(window=w),
              ['hann', 'hamming'],
              names=['Hann', 'Hamming'])
    .run()
)

# Access all three levels of variants
for params, result in results:
    signal_name = params['variant'][0]
    range_window = params['variant'][1]
    doppler_window = params['variant'][2]
    print(f"{signal_name} + Range:{range_window} + Doppler:{doppler_window}")
```

Live Example: 2 Files × 2 Range × 2 Doppler = 8 Combinations

Generating and saving 2 signals, then loading lazily with processing variants:

| Signal   | Range Window | Doppler Window | Peak Value |
|----------|--------------|----------------|------------|
| Target 1 | Hamming      | Hann           | 600.6      |
| Target 1 | Hamming      | Hamming        | 648.5      |
| Target 1 | Blackman     | Hann           | 466.4      |
| Target 1 | Blackman     | Hamming        | 503.6      |
| Target 2 | Hamming      | Hann           | 611.2      |
| Target 2 | Hamming      | Hamming        | 663.2      |
| Target 2 | Blackman     | Hann           | 475.0      |
| Target 2 | Blackman     | Hamming        | 515.3      |

Summary

Key benefits of lazy loading with `.variants()`:

1. **Memory Efficient**: Only one signal in memory at a time
2. **Scalable**: Process hundreds of files without memory issues
3. **Flexible**: Combine with processing variants for full exploration
4. **Consistent Processing**: Same pipeline applied to all data
5. **Easy Pattern**: Just wrap your loader in a factory function

Pattern to remember:

```python
def make_loader(filename):
    def load(_):
        # Load happens here, during execution
        data = load_from_somewhere(filename)
        return SignalData(data, sample_rate=...)
    return load

results = Graph().variants(make_loader, file_list, names=...).add(...).run()
```

Use cases:

- Processing large datasets that don't fit in memory
- Batch processing many files from disk or network
- Testing algorithms across multiple scenarios
- Comparing data from different sensors or time periods